Probability Interval Partitioning Entropy Codes
Abstract
A novel approach to entropy coding is described that provides the coding efficiency and simple probability modeling capability of arithmetic coding at the complexity level of Huffman coding. The key element of the proposed approach is given by a partitioning of the unit interval into a small set of disjoint probability intervals for pipelining the coding process along the probability estimates of binary random variables. According to this partitioning, an input sequence of discrete source symbols with arbitrary alphabet sizes is mapped to a sequence of binary symbols and each of the binary symbols is assigned to one particular probability interval. With each of the intervals being represented by a fixed probability, the probability interval partitioning entropy (PIPE) coding process is based on the design and application of simple variable-to-variable length codes. Probability modeling can either be fixed or adaptive while the PIPE code for each probability interval remains fixed and is decoupled from the modeling stage. The excess rate of PIPE coding is shown to be comparable to that of arithmetic coding.
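As a rough illustration of the interval-partitioning idea, the following Python sketch routes binary symbols to a small set of probability intervals, each represented by one fixed probability. The boundaries, representative values, and the assumption that probability estimates are given for the less probable symbol in (0, 0.5] are illustrative choices, not the paper's actual design; in a full PIPE coder, each collected stream would then be compressed with a variable-to-variable length code matched to its representative probability.

import bisect

# Hypothetical partition of (0, 0.5] into four probability intervals.
# Boundaries and representative probabilities are invented for this sketch.
BOUNDARIES = [0.10, 0.20, 0.35, 0.50]       # upper edge of each interval
REPRESENTATIVES = [0.05, 0.15, 0.27, 0.42]  # fixed probability per interval

def assign_interval(p_lps):
    """Map an estimated probability of the less probable symbol,
    assumed to lie in (0, 0.5], to its probability interval index."""
    return bisect.bisect_left(BOUNDARIES, p_lps)

def pipe_route(bins_with_probs):
    """Route each (bit, probability) pair to one of the partial bitstreams.
    A real coder would apply a V2V code per stream; here we only collect."""
    streams = [[] for _ in BOUNDARIES]
    for bit, p in bins_with_probs:
        streams[assign_interval(p)].append(bit)
    return streams

# Three binary symbols with (possibly adaptive) probability estimates:
print(pipe_route([(1, 0.04), (0, 0.30), (1, 0.30)]))
# -> [[1], [], [0, 1], []]: bits grouped by probability interval

Note that the probability model may change from symbol to symbol while each stream's code stays fixed, which is exactly the decoupling of modeling and coding described above.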
Similar Resources
Equal-image-size source partitioning: Creating strong Fano's inequalities for multi-terminal discrete memoryless channels
This paper introduces equal-image-size source partitioning, a new tool for analyzing channel and joint source-channel coding in a multi-terminal discrete memoryless channel environment. Equal-image-size source partitioning divides the source (combination of messages and codewords) into a sub-exponential number of subsets. Over each of these subsets, the exponential orders of the minimum image s...
Ergodic Actions of Countable Groups and Finite Generating Partitions
We prove the following finite generator theorem. Let G be a countable group acting ergodically on a standard probability space. Suppose this action admits a generating partition having finite Shannon entropy. Then the action admits a finite generating partition. We also discuss relationships between generating partitions and f-invariant and sofic entropies.
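Stated symbolically (a restatement of the theorem above, using the standard Shannon entropy of a countable partition):

Let $G \curvearrowright (X, \mathcal{B}, \mu)$ be an ergodic action of a countable group on a standard probability space. If there is a generating partition $P = \{P_1, P_2, \dots\}$ with
$$H(P) = -\sum_i \mu(P_i) \log \mu(P_i) < \infty,$$
then the action also admits a finite generating partition.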
Neuronal Goals: Efficient Coding and Coincidence Detection
Barlow's seminal work on minimal entropy codes and unsupervised learning is reiterated. In particular, the need to transmit the probability of events is put in a practical neuronal framework for detecting suspicious events. A variant of the BCM learning rule [15] is presented together with some mathematical results suggesting optimal minimal entropy coding.
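For context, the classical BCM rule (the snippet's reference [15]; the paper's exact variant is not reproduced here) strengthens a synapse when postsynaptic activity exceeds a sliding threshold and weakens it below, with the threshold tracking the average squared activity. A minimal sketch in Python, with an assumed learning rate and averaging constant:

import numpy as np

def bcm_step(w, x, theta, lr=0.01, tau=100.0):
    """One classical BCM-style update (illustrative, not the paper's variant).
    w: weight vector, x: presynaptic input, theta: sliding threshold."""
    y = float(np.dot(w, x))                  # postsynaptic activity
    w = w + lr * y * (y - theta) * x         # potentiate above theta, depress below
    theta = theta + (y ** 2 - theta) / tau   # theta tracks the mean of y^2
    return w, theta

rng = np.random.default_rng(0)
w, theta = 0.1 * rng.normal(size=4), 1.0
for _ in range(1000):
    w, theta = bcm_step(w, rng.normal(size=4), theta)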
Determination of Maximum Bayesian Entropy Probability Distribution
In this paper, we consider methods for determining maximum entropy multivariate distributions with a given prior, under the constraints that the marginal distributions, or the marginals together with the covariance matrix, are prescribed. Next, numerical solutions are considered for cases where no closed-form solution is available. Finally, these methods are illustrated via numerical examples.
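For orientation, the standard Lagrangian argument shows what shape such maximizers take in the classical (uniform-prior) case; with a general prior, entropy is replaced by relative entropy to that prior. Maximizing differential entropy under moment constraints yields an exponential-family density, and with a prescribed mean and covariance the maximizer is the multivariate Gaussian. Schematically:

$$\max_{f} \; -\int f(x) \ln f(x)\, dx \quad \text{s.t.} \quad \int f(x)\, dx = 1, \;\; \int T_i(x) f(x)\, dx = c_i$$
$$\Longrightarrow \quad f^{*}(x) = \exp\Big(\lambda_0 + \sum_i \lambda_i T_i(x)\Big),$$
with the multipliers $\lambda_i$ chosen to satisfy the constraints.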
Introduction to Data Compression
Contents (excerpt): 3 Probability Coding; 3.1 Prefix Codes; 3.1.1 Relationship to Entropy; 3.2 Huffman Codes; 3.2.1 Combining Messages; 3.2.2 Minim...
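As a pointer to what the excerpted chapter covers, a textbook Huffman construction (a generic implementation, not code from these notes) fits in a few lines of Python:

import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix code from symbol frequencies with a min-heap.
    Returns a dict mapping each symbol to its binary codeword."""
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    counter = len(heap)  # tiebreaker so dicts are never compared
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prefix '0' to one subtree's codewords and '1' to the other's.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

print(huffman_code("abracadabra"))
# frequent symbols ('a') receive shorter codewords

The greedy merge of the two least probable subtrees is the step that makes the resulting prefix code optimal among symbol codes.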
Publication date: 2010